40 research outputs found

    Vibration Monitoring: Gearbox identification and faults detection

    The abstract is in the attachment.

    ANOVA and other statistical tools for bearing damage detection

    The aim of the paper is to exhaustively exploit and test some statistical tools, such as ANOVA and Linear Discriminant Analysis, to investigate a massive amount of data collected on the rig available at the DIRG Lab, specifically conceived to test high-speed aeronautical bearings; the rig permits the control of rotational speed (6000 – 30000 RPM), radial load (0 to 1800 N) and temperature, and allows monitoring vibrations by means of 4 tri-axial accelerometers. Fifteen different damages have been realised on the bearing but, for simplicity, this paper only treats those cases where simple identification methods have failed or have not proven to be fully reliable. The damages have been introduced on the rollers or on the internal ring, with different severities, which are reported as a function of their extension, i.e. 150, 250, 450 μm. A total of 17 combinations of load and speed have been analysed for each damaged bearing. Although ANOVA rigorously applies only when certain conditions on the probability distribution of the responses are respected, namely independence of observations, normality (normal distribution of the residuals) and homoscedasticity (homogeneity of variances), the paper exploits the robustness of the technique even when the data do not fully meet these requisites. Analyses are focused on the best features to be taken into account, seeking the most informative ones, but also trying to extract a “best choice” for the acceleration direction and the most informative point to be monitored on the simple structure. Focusing on the classification of single observations, Linear Discriminant Analysis has been tested, proving quite effective, as the number of misclassifications is not very high (at least considering the widest damages). All these classifications unfortunately have the limit of requiring labelled examples. Acquisitions in undamaged and damaged conditions are in fact essential to guarantee their applicability, which is quite often impossible for real industrial plants. The target can anyway be reached by adopting distances from undamaged conditions which, conversely, must be known as a reference. Advantages of the statistical methods are quickness, simplicity and full independence from human interaction.
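As a minimal illustration of the two tools named in the abstract, the sketch below runs a one-way ANOVA and an LDA classification on a synthetic single feature. All values, class labels and sample sizes are invented for illustration and are not the DIRG measurements:

```python
import numpy as np
from scipy.stats import f_oneway
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
# Hypothetical RMS-like vibration feature for three conditions
# (class names and magnitudes are illustrative assumptions)
healthy = rng.normal(1.0, 0.1, 50)
roller_450 = rng.normal(1.4, 0.1, 50)
ring_450 = rng.normal(1.8, 0.1, 50)

# One-way ANOVA: does the mean feature differ across conditions?
F, p = f_oneway(healthy, roller_450, ring_450)

# LDA: classify each single observation from the same feature
X = np.concatenate([healthy, roller_450, ring_450]).reshape(-1, 1)
y = np.repeat([0, 1, 2], 50)
lda = LinearDiscriminantAnalysis().fit(X, y)
accuracy = lda.score(X, y)
```

With well-separated class means, the ANOVA p-value is tiny and LDA misclassifies few observations; note that both steps require labelled examples, which is the limitation the abstract points out.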

    Performance of Envelope Demodulation for Bearing Damage Detection on CWRU Accelerometric Data: Kurtogram and Traditional Indicators vs. Targeted a Posteriori Band Indicators

    Envelope demodulation of vibration signals is surely one of the most successful methods of analysis for highlighting diagnostic information of rolling element bearing incipient faults. From a mathematical perspective, the selection of a proper demodulation band can be regarded as an optimization problem involving a utility function to assess the demodulation performance in a particular band and a scheme to move within the search space of all the possible frequency bands {f, Δf} (center frequency and band size) towards the optimal one. In most cases, kurtosis-based indices are used to select the proper demodulation band. Nevertheless, to overcome the lack of robustness to non-Gaussian noise, different utility functions can be found in the literature. One of these is the kurtosis of the unbiased autocorrelation of the squared envelope of the filtered signal, found in the Autogram. These heuristics are usually sufficient to highlight the defect spectral lines in the demodulated signal spectrum (i.e., usually the squared envelope spectrum (SES)), enabling bearing diagnostics. Nevertheless, this is not always the case. In this work, then, a posteriori band indicators based on SES defect spectral lines are proposed to assess the general envelope demodulation performance and the goodness of traditional indicators. The Case Western Reserve University bearing dataset is used as a test case.
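A minimal sketch of the envelope demodulation chain described above, on a synthetic signal (sample rate, resonance, modulation rate and band limits are all invented for illustration; the fault frequency is recovered as the dominant SES line):

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

# Synthetic example: a 3 kHz resonance amplitude-modulated at a
# hypothetical 105 Hz defect rate, buried in Gaussian noise
fs = 12_000
t = np.arange(0, 1.0, 1 / fs)
rng = np.random.default_rng(1)
x = (1 + 0.8 * np.sin(2 * np.pi * 105 * t)) * np.sin(2 * np.pi * 3000 * t)
x += 0.3 * rng.standard_normal(t.size)

# Band-pass filter in a chosen demodulation band {f, Δf} = {3000, 1000} Hz
sos = butter(4, [2500, 3500], btype="bandpass", fs=fs, output="sos")
xf = sosfiltfilt(sos, x)

# Squared envelope spectrum (SES): the defect line emerges at 105 Hz
env2 = np.abs(hilbert(xf)) ** 2
ses = np.abs(np.fft.rfft(env2 - env2.mean()))
freqs = np.fft.rfftfreq(env2.size, 1 / fs)
peak_freq = freqs[np.argmax(ses)]
```

An a posteriori band indicator in the spirit of the paper would then score the band by the prominence of the SES line at the (known) defect frequency, rather than by kurtosis alone.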

    Least squares smoothed k-nearest neighbors online prediction of the remaining useful life of a NASA turbofan

    An accurate prediction of the Remaining Useful Life (RUL) of aircraft engines plays a fundamental role in the aerospace field, since it is both mission and safety critical. In fact, a reliable estimate of the RUL can effectively reduce maintenance costs while fostering safety. This paper proposes a novel data-driven method to increase the accuracy of the RUL prediction for real-time prognostic systems, considering multiple degradation mechanisms and making the model easy to implement. The proposed method exploits a novel modified k-Nearest Neighbors Interpolation (kNNI) with an a posteriori Least Squares Smoothing (LSS) automatically optimized to obtain the minimum prediction error. The novel LSS formulation was also generalized and proved to be equivalent to a Cumulative and Moving Average (CMA) mixture filter, which can easily be implemented online. The method was developed and validated on a new NASA dataset generated by the dynamic model Commercial Modular Aero-Propulsion System Simulation (N-CMAPSS), with run-to-failure data related to a small fleet of aircraft engines under realistic flight conditions. Finally, a reference kNN-based method already known in the literature was compared to the novel proposed one to demonstrate the goodness of the results and the performance improvements.
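The two-stage idea (kNN interpolation of the RUL followed by an online smoothing filter) can be sketched as follows. This is a toy stand-in, not the paper's kNNI/LSS: the degradation data are invented and a plain trailing moving average replaces the optimized LSS/CMA filter:

```python
import numpy as np

def knn_rul(train_x, train_rul, query, k=5):
    """Plain kNN interpolation (a simplified stand-in for the paper's kNNI):
    predict RUL as the mean RUL of the k nearest training snapshots."""
    d = np.linalg.norm(train_x - query, axis=1)
    nearest = np.argsort(d)[:k]
    return train_rul[nearest].mean()

def smooth(preds, win=5):
    """Trailing moving average, a much simplified stand-in for the optimized
    LSS/CMA filter; like the CMA mixture, it can run online sample by sample."""
    out = np.empty_like(preds, dtype=float)
    for i in range(len(preds)):
        out[i] = preds[max(0, i - win + 1): i + 1].mean()
    return out

# Hypothetical run-to-failure history: one health feature degrading linearly
rng = np.random.default_rng(2)
cycles = np.arange(100)
feature = 0.05 * cycles + rng.normal(0, 0.2, 100)
true_rul = 100.0 - cycles

train_x = feature.reshape(-1, 1)
raw = np.array([knn_rul(train_x, true_rul, [0.05 * c]) for c in cycles])
smoothed = smooth(raw)
```

The smoothing step trades a small lag for a visibly less jumpy RUL trajectory, which is the practical motivation for filtering the raw kNN output.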

    Big Data management: A Vibration Monitoring point of view

    Vibration Monitoring is a particular kind of Condition Monitoring meant to infer the state of health of a machine from accelerometric measurements. From a practical point of view, the scope is then to extract from the acceleration data some valuable diagnostic information which can be used to detect the presence of possible damages (i.e., to produce knowledge about the state of health). When the monitoring is implemented online, in a continuous way, the raw accelerometric data sets can be very large and complex to deal with, as they usually involve multiple channels (i.e., multiple locations and directions) and high sample rates (i.e., on the order of ksps – 10³ samples per second), but the final knowledge about the state of health can, in principle, be summarized by a single binary piece of information (i.e., healthy – 0 vs damaged – 1). This is commonly called Damage Detection. In this work, the big data management challenge is tackled from the point of view of statistical signal processing, so as to aggregate the multivariate data and condense them into a single piece of information: the distance with respect to a healthy reference condition (i.e., the Novelty). When confounding influences (such as the working condition or the environmental condition) can be disregarded, the novelty information has a direct correspondence to the health information, so that an alarm indicating the detection of damage can be triggered upon exceeding a selected threshold, the limit novelty. Many different ways of solving such a binary classification problem can be found in the literature. Starting from the simplest, some of the most effective are compared in the present analysis, to finally select a reliable procedure for big data management in vibration monitoring.
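A minimal sketch of the condensation step described above, assuming invented multichannel features: the multivariate observation is reduced to a single novelty value (here a Mahalanobis distance, one common choice) and compared against a limit novelty estimated from the healthy reference:

```python
import numpy as np

rng = np.random.default_rng(3)
# Hypothetical reference acquisition: 6 aggregated features per observation
healthy = rng.normal(0.0, 1.0, (200, 6))
mu = healthy.mean(axis=0)
inv_cov = np.linalg.inv(np.cov(healthy, rowvar=False))

def novelty_index(x):
    """Mahalanobis distance from the healthy reference: one common way to
    condense a multivariate observation into a single novelty value."""
    d = x - mu
    return float(np.sqrt(d @ inv_cov @ d))

# Limit novelty threshold from the healthy NIs (e.g. the 99th percentile)
threshold = np.percentile([novelty_index(x) for x in healthy], 99)

# A strongly shifted observation simulating damage triggers the alarm
damaged = rng.normal(3.0, 1.0, 6)
alarm = novelty_index(damaged) > threshold
```

The binary health output (0/1) is then simply the alarm flag, regardless of how many channels and samples went into the raw data.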

    On the Use of PCA for Diagnostics via Novelty Detection: Interpretation, Practical Application Notes and Recommendation for Use

    The Principal Component Analysis (PCA) is the simplest eigenvector-based multivariate data analysis tool and dates back to 1901, when Karl Pearson proposed it as a way of finding the best-fitting (d-1)-dimensional hyperplane of a system of points in a d-dimensional (Euclidean) space. Over time, PCA evolved in different fields under several different names and with different scopes but, in its essence, it is always an orthogonal transformation that converts a set of observations of possibly correlated variables into a set of values of linearly uncorrelated variables called principal components. Generalizing Pearson’s purpose, the knowledge derived from such an analysis is mostly used to find a subspace which effectively and efficiently summarizes the original system of points while losing a minimum amount of information. In the field of Diagnostics, the fundamental task of detecting damage is basically a binary classification problem which is in many cases tackled via Novelty Detection: an observation is classified as novel if it differs significantly from other observations. Novelty can, in principle, be assessed directly in the original space, but the effectiveness of the estimated novelty can be improved by taking advantage of the PCA. In this work, the traditional PCA will be compared to a robust modification that is commonly used in the field of diagnostics to face the issue of confounding influences which could affect the novelty-damage correspondence. Comparisons will be made to shed light on the main misleading aspects of PCA and, finally, to define a unique, theoretically justified procedure for Diagnostics via Novelty Detection.
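The following sketch illustrates how PCA can improve novelty estimation, on invented data lying near Pearson's best-fitting hyperplane: novelty is measured as the distance from the retained principal subspace, so directions of normal variability do not inflate it:

```python
import numpy as np

rng = np.random.default_rng(4)
# Hypothetical healthy observations lying close to a 2-D plane in 3-D space
latent = rng.normal(0.0, 1.0, (300, 2))
mixing = np.array([[1.0, 0.5, 0.1],
                   [0.2, 1.0, 0.3]])
X = latent @ mixing + 0.05 * rng.normal(0.0, 1.0, (300, 3))

# PCA via eigendecomposition of the covariance (Pearson's best-fit hyperplane)
mu = X.mean(axis=0)
eigval, eigvec = np.linalg.eigh(np.cov(X - mu, rowvar=False))  # ascending
P = eigvec[:, ::-1][:, :2]          # the two dominant principal components

def residual_novelty(x):
    """Distance from the principal subspace (reconstruction error), a novelty
    index insensitive to the directions of normal variability."""
    d = x - mu
    return float(np.linalg.norm(d - P @ (P.T @ d)))

in_plane = residual_novelty(X[0])                     # healthy observation
off_plane = residual_novelty(mu + 2.0 * eigvec[:, 0])  # along discarded axis
```

A robust variant, as discussed in the paper, would replace the sample mean and covariance with robust estimates so that confounding influences in the training data do not distort the subspace.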

    From Novelty Detection to a Genetic Algorithm Optimized Classification for the Diagnosis of a SCADA-Equipped Complex Machine

    In the field of Diagnostics, the fundamental task of detecting damage is basically a binary classification problem, which is addressed in many cases via Novelty Detection (ND): an observation is classified as novel if it differs significantly from reference, healthy data. ND is practically implemented by summarizing a multivariate dataset with univariate distance information called a Novelty Index (NI). As many different approaches are possible to produce NIs, in this analysis the possibility of implementing a simple classifier in a reduced-dimensionality space of NIs is studied. Besides enabling a simple decision-tree-like classification method, the process for obtaining the NIs can act as a dimension reduction method and, in turn, the NIs can be used as input for other classification algorithms. Finally, a case study is analyzed thanks to the data published by the Prognostics and Health Management Europe (PHME) society on the occasion of the Data Challenge 2021.
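The idea of classifying in a reduced-dimensionality space of NIs can be sketched as follows, on invented multiclass data: one NI (here a simple Euclidean distance) is computed per reference class, and a trivial decision-tree-like rule, pick the class with the smallest NI, acts as the classifier:

```python
import numpy as np

rng = np.random.default_rng(5)
# Hypothetical 8-D observations from a healthy class and two fault classes
classes = {0: rng.normal(0.0, 1.0, (100, 8)),
           1: rng.normal(2.0, 1.0, (100, 8)),
           2: rng.normal(-2.0, 1.0, (100, 8))}

# One novelty index per reference class: distance to that class's mean
mus = {c: data.mean(axis=0) for c, data in classes.items()}

def nis(x):
    """Map an 8-D observation to the 3-D space of novelty indices."""
    return np.array([np.linalg.norm(x - mus[c]) for c in sorted(mus)])

# Decision-tree-like rule in NI space: assign the class with the smallest NI
correct = sum(int(np.argmin(nis(x)) == c)
              for c, data in classes.items() for x in data)
total = sum(len(data) for data in classes.values())
accuracy = correct / total
```

The same 3-D NI vectors could instead be fed to any off-the-shelf classifier, which is the dimension-reduction use of NIs the abstract mentions.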

    Confounding factors analysis and compensation for high-speed bearing diagnostics

    In recent years, machine diagnostics through vibration monitoring has been gaining interest. Indeed, many advanced techniques are available in the literature to disclose the establishment of a fault as well as the damage type, location and severity. Unfortunately, in general, these high-level algorithms are not robust to operational and environmental variables, restricting the field of applicability of machine diagnostics. In fact, most industrial machines work with variable loads, at variable speeds and in uncontrolled environments, so that the measured signals are often non-stationary. The very common time-series features based on statistical moments (such as root mean square, skewness, kurtosis, peak value and crest factor) undergo variations related to changes in the machine operational parameters (e.g. speed, load, …) or in the environmental parameters (e.g. temperature, humidity, …), which can be seen as non-measured, and therefore latent, confounding factors with respect to the health information of interest. In order to face this issue, statistical techniques such as (in a first exploratory stage) Principal Component Analysis, or Factor Analysis, are available. The pursuit of features insensitive to these factors can also be tackled by exploiting the cointegration property of non-stationary signals. In this paper, the most common methods for reducing the influence of latent factors are considered and applied to investigate the data collected on the rig available at the DIRG laboratory, specifically conceived to test high-speed aeronautical bearings, monitoring vibrations by means of 2 tri-axial accelerometers while controlling the rotational speed (0 – 30000 RPM) and the radial load (0 to 1800 N) and recording the lubricant oil temperature.
The compensation scheme is based on two procedures which are well established in univariate analyses, but not so well documented in multivariate cases: the removal of deterministic trends by subtraction of a regression, and the removal of stochastic trends in difference-stationary series by subtraction of the one-step-ahead prediction from an autoregressive model. The extension of these methods to the multivariate case is here analysed to find an effective way of enhancing damage patterns when the masking effect due to the non-stationarities induced by latent factors is strong.
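The two univariate building blocks of the compensation scheme can be sketched as follows, on an invented feature series contaminated by both a deterministic drift and a stochastic (random-walk) trend:

```python
import numpy as np

rng = np.random.default_rng(6)
n = 500
t = np.arange(n)
# Hypothetical feature contaminated by a deterministic trend (e.g. a slow
# temperature drift) plus a stochastic random-walk trend from latent factors
det_trend = 0.01 * t
stoch_trend = np.cumsum(0.1 * rng.standard_normal(n))
x = det_trend + stoch_trend + 0.05 * rng.standard_normal(n)

# 1) Remove the deterministic trend by subtracting a fitted regression
coef = np.polyfit(t, x, 1)
x_det = x - np.polyval(coef, t)

# 2) Remove the stochastic trend by subtracting the one-step-ahead
#    prediction of an AR(1) model fitted by least squares
phi = (x_det[:-1] @ x_det[1:]) / (x_det[:-1] @ x_det[:-1])
resid = x_det[1:] - phi * x_det[:-1]
```

The residual series is far closer to stationary than the raw feature, so that a genuine damage-induced shift would no longer be masked; the paper's contribution is extending this per-series logic to the multivariate case.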

    Fast Computation of the Autogram for the Detection of Transient Faults

    The maintenance of structures and machines is a hot topic, as their failure can be both expensive and dangerous. Condition-based maintenance regimes are ever more desired, so that cost-effective, reliable and damage-responsive diagnostic techniques are needed. Among the others, Vibration Monitoring using accelerometers is a minimally invasive technique that can in principle detect even small, incipient damages. Focusing on transient faults, one reliable processing method to highlight their presence is the envelope analysis of the vibration signal filtered in a band of interest. The challenge of selecting an appropriate band for the demodulation is an optimization problem requiring two ingredients: a utility function to evaluate the performance in a particular band, and a scheme to move within the search space of all the possible center frequencies and band sizes (the dyad {f, Δf}) toward the optimal one. These problems were effectively tackled by the Kurtogram, a brute-force computation of the kurtosis of the envelope of the filtered signal (the utility function) for every possible {f, Δf} combination. The complete exploration of the whole (f, Δf) plane is a heavy task which compromises the computational efficiency of the algorithm, so that an analysis on a discrete (f, Δf) paving was implemented (the Fast Kurtogram). To overcome the lack of robustness to non-Gaussian noise, different utility functions were proposed. One is the kurtosis of the unbiased autocorrelation of the squared envelope of the filtered signal, found in the Autogram. To spread this improved algorithm in online industrial applications, a fast implementation of the Autogram is proposed in this paper.